Federated Learning with Lossy Distributed Source Coding: Analysis and Optimization

Authors

Abstract

Recently, federated learning (FL), which replaces data sharing with model sharing, has emerged as an efficient and privacy-friendly machine learning (ML) paradigm. One of the main challenges in FL is the huge communication cost of model aggregation. Many compression/quantization schemes have been proposed to reduce this cost. However, the following question remains unanswered: what is the fundamental trade-off between the communication cost and the convergence performance? In this paper, we manage to answer this question. Specifically, we first put forth a general framework for aggregation performance analysis based on rate-distortion theory. Under this framework, we derive an inner bound of the rate-distortion region. We then conduct a convergence analysis to connect the aggregation distortion with the learning performance, and formulate a distortion minimization problem to improve convergence. Two algorithms are developed to solve the above problem. Numerical results on the distortion and the convergence performance demonstrate that existing baseline schemes still have great potential for further improvement.
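To make the setting concrete, the following is a minimal Python sketch of lossy aggregation in FL: each client quantizes its update before transmission, and the server averages the distorted updates. It assumes a simple stochastic uniform quantizer and a fixed step size of our own choosing; it only illustrates where aggregation distortion enters, and is not the rate-distortion-optimal scheme analyzed in the paper.

```python
import numpy as np

def stochastic_quantize(v, num_levels=16):
    """Lossy stochastic uniform quantization of a vector (illustrative only)."""
    vmin, vmax = v.min(), v.max()
    if vmax == vmin:
        return v.copy()
    scale = (vmax - vmin) / (num_levels - 1)
    # Map to the quantization grid and round stochastically (unbiased).
    normalized = (v - vmin) / scale
    lower = np.floor(normalized)
    prob_up = normalized - lower
    levels = lower + (np.random.rand(*v.shape) < prob_up)
    return vmin + levels * scale

def federated_round(global_model, client_grads, num_levels=16, step=0.1):
    """One aggregation round: clients send quantized updates, the server averages."""
    quantized = [stochastic_quantize(g, num_levels) for g in client_grads]
    aggregate = np.mean(quantized, axis=0)   # aggregation distortion enters here
    return global_model - step * aggregate

# Toy usage: 4 clients, a 10-dimensional model.
model = np.zeros(10)
grads = [np.random.randn(10) for _ in range(4)]
model = federated_round(model, grads)
```

Fewer quantization levels reduce the per-round communication cost but increase the aggregation distortion, which is exactly the trade-off the abstract refers to.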


Related Articles

Distributed Lossy Source Coding Using BCH-DFT Codes

Distributed source coding, separate encoding (compression) and joint decoding of statistically dependent sources, arises in an increasing number of applications like sensor networks and multiview video coding. Many of those applications are highly interactive, requiring the development of low-delay, energy-limited communication and computing schemes. Currently, this compression is performed by ...
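As background for "separate encoding and joint decoding of statistically dependent sources", the classical lossless benchmark is the Slepian-Wolf rate region; the lossy schemes built from BCH-DFT codes generalize this setting. A short LaTeX recap of the two-source region:

```latex
% Slepian--Wolf region for lossless distributed coding of (X_1, X_2):
\begin{aligned}
R_1       &\ge H(X_1 \mid X_2), \\
R_2       &\ge H(X_2 \mid X_1), \\
R_1 + R_2 &\ge H(X_1, X_2).
\end{aligned}
```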


Robust Distributed Source Coding with Arbitrary Number of Encoders and Practical Code Design Technique

The robustness property can be added to a DSC system at the expense of reduced performance, i.e., an increased sum-rate. The aim of designing robust DSC schemes is to trade off system robustness against compression efficiency. In this paper, after deriving an inner bound on the rate–distortion region for the quadratic Gaussian MDC-based RDSC system with two encoders, the structure of...


Systematic Lossy Source/Channel Coding

The fundamental limits of “systematic” communication are analyzed. In systematic transmission, the decoder has access to a noisy version of the uncoded raw data (analog or digital). The coded version of the data is used to reduce the average reproduced distortion D below that provided by the uncoded systematic link and/or increase the rate of information transmission. Unlike the case of arbitra...


Source Coding Optimization for Distributed Average Consensus

PILGRIM, RYAN ZACHARY. Source Coding Optimization for Distributed Average Consensus. (Under the direction of Dror Baron.) Consensus is a common method for computing a function of the data distributed among the nodes of a network. Of particular interest is distributed average consensus, whereby the nodes iteratively compute the sample average of the data stored at all the nodes of the network us...
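For readers unfamiliar with distributed average consensus, the standard linear iteration x_{k+1} = W x_k with a doubly stochastic weight matrix W drives every node's value to the network-wide average. The sketch below is a plain (unquantized) toy example on a ring of nodes; the cited work additionally applies source coding to the values exchanged in each round.

```python
import numpy as np

n = 6
x = np.random.randn(n)            # each node holds one scalar measurement
target = x.mean()

# Doubly stochastic weights on a ring: each node averages with its two neighbours.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 1/3
    W[i, (i - 1) % n] = 1/3
    W[i, (i + 1) % n] = 1/3

for _ in range(200):              # x_{k+1} = W x_k
    x = W @ x

print(np.allclose(x, target, atol=1e-6))   # True: every node holds the average
```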


Stochastic, Distributed and Federated Optimization for Machine Learning

We study optimization algorithms for the finite-sum problems frequently arising in machine learning applications. First, we propose novel variants of stochastic gradient descent with a variance reduction property that enables linear convergence for strongly convex objectives. Second, we study the distributed setting, in which the data describing the optimization problem does not fit into a single c...
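One widely used variance-reduction construction is SVRG, which corrects each stochastic gradient using a periodically recomputed full gradient. The toy sketch below applies it to a least-squares finite sum; it illustrates the general idea only and is not necessarily one of the exact variants proposed in the cited work.

```python
import numpy as np

# Toy SVRG sketch for f(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)

def grad_i(w, i):
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    return A.T @ (A @ w - b) / n

w = np.zeros(d)
step = 0.02
for epoch in range(50):
    w_snap = w.copy()
    mu = full_grad(w_snap)               # full gradient at the snapshot point
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient: unbiased, with shrinking variance.
        g = grad_i(w, i) - grad_i(w_snap, i) + mu
        w -= step * g

print(np.linalg.norm(full_grad(w)))      # should be close to 0
```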



Journal

Journal title: IEEE Transactions on Communications

Year: 2023

ISSN: 1558-0857, 0090-6778

DOI: https://doi.org/10.1109/tcomm.2023.3277882